Quantum RAM Based Neural Networks
Author
Abstract
A mathematical quantisation of a Random Access Memory (RAM) is proposed, starting from its matrix representation. This quantum RAM (q-RAM) is employed as the neural unit of q-RAM-based Neural Networks, q-RbNN, which can be seen as the quantisation of the corresponding RAM-based ones. The models proposed here are directly realisable in quantum circuits, have a natural adaptation of the classical learning algorithms, and offer the physical feasibility of quantum learning, in contrast to what has been proposed in the literature.

∗ Supported by MCT-CNPq and PRONEX/FACEPE. On sabbatical leave at the School of Computer Science, University of Birmingham.

1 Quantum computation and Mathematical Quantisation

Quantum computing [14] was originally proposed by Richard Feynman [7] in the 1980s and had its formalisation with David Deutsch, who proposed the quantum Turing machine [4]. Quantum computing has been popularised through the quantum circuit model [5], which is a quantisation [15] of the classical boolean circuit model of computation. The quantum computer has also become a potential parallel device for improving the computational efficiency of neural networks [6]. The quantum information unit is the quantum bit, or "qubit".

A very intuitive view of the quantisation procedure is put forward by Nik Weaver in the preface of his book Mathematical Quantization [15] with just a phrase which says it all: "The fundamental idea of mathematical quantisation is sets are replaced with Hilbert spaces". The quantisation of boolean circuit logic starts by simply embedding the classical bits {0, 1} in a convenient Hilbert space. The natural way of doing this is to represent them as an (orthonormal) basis of a complex Hilbert space. In this context these basis elements are called the computational-basis states. Linear combinations (from Linear Algebra [9]) of the basis span the whole space, whose elements, called states, are said to be in superposition. Any basis can be used (recall from Linear Algebra [9] that there usually are many!), but in Quantum Computing it is customary to use the most conventional and well-known one: |0〉 and |1〉 are a pair of orthonormal basis vectors representing each classical bit, or "cbit", as a column vector, |0〉 = [1 0]^T and |1〉 = [0 1]^T. A general state of the system (a vector) can be written as |ψ〉 = α|0〉 + β|1〉, where α and β are complex coefficients (called probability amplitudes) constrained by the normalisation condition |α|² + |β|² = 1. This is the model of one qubit. Multiple qubits are obtained via tensor products. By linearity we just need to say how the tensor product behaves on the basis: |i〉 ⊗ |j〉 = |i〉 |j〉 = |ij〉, where i, j ∈ {0, 1}.

If that were all the story, quantum computing would be just a trivial extension of classical computing. But the principles of Quantum Mechanics [14] restrict the kind of permissible operations: operations on qubits are carried out only by unitary operators (i.e. linear operators U satisfying U†U = UU† = I).
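The vector formulation above translates directly into a few lines of linear algebra. The sketch below is not part of the paper; it merely restates the standard definitions in NumPy: the basis kets as column vectors, a superposition obeying |α|² + |β|² = 1, a two-qubit state via the Kronecker (tensor) product, and the Hadamard gate as an example of a unitary operator.

```python
import numpy as np

# Computational-basis states |0> and |1> as column vectors.
ket0 = np.array([[1.0], [0.0]], dtype=complex)
ket1 = np.array([[0.0], [1.0]], dtype=complex)

# A general one-qubit state |psi> = alpha|0> + beta|1> with
# |alpha|^2 + |beta|^2 = 1 (here alpha = beta = 1/sqrt(2)).
alpha = beta = 1 / np.sqrt(2)
psi = alpha * ket0 + beta * ket1
assert np.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)

# Multiple qubits are built with the tensor (Kronecker) product:
# |0>|1> = |01>, the basis vector of C^4 indexed by the binary string 01.
ket01 = np.kron(ket0, ket1)

# Operations on qubits are unitary matrices U (U^dagger U = I); the
# Hadamard gate is a standard example, mapping |0> to (|0> + |1>)/sqrt(2).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
assert np.allclose(H.conj().T @ H, np.eye(2))
print(H @ ket0)  # (|0> + |1>)/sqrt(2), i.e. [0.707..., 0.707...]^T
```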
Similar resources
Vector space weightless neural networks
By embedding the boolean space Z2 as an orthonormal basis in a vector space we can treat the RAM-based neuron as a matrix (operator) acting on the vector space. We show how this model (inspired by our research on quantum neural networks) is of sufficient generality as to have classical weighted (perceptron-like), classical weightless (RAM-based, PLN, etc.), quantum weighted and quantum weightless...
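As a rough illustration of this embedding (a sketch under our own assumptions, not necessarily the authors' exact construction), the snippet below writes a two-input RAM node's lookup table as the matrix M = Σ_a |C[a]〉〈a|, so that addressing the node becomes a matrix-vector product on the embedded basis.

```python
import numpy as np

# Hypothetical two-input RAM node: one stored bit C[a] per address a.
contents = {0b00: 0, 0b01: 1, 0b10: 1, 0b11: 0}  # an XOR-like lookup table
n = 2                                            # number of address lines

def basis(dim, k):
    """Standard basis column vector e_k of C^dim (the embedding of k)."""
    e = np.zeros((dim, 1))
    e[k, 0] = 1.0
    return e

# Embed each address a as a basis vector of C^(2**n) and each stored bit
# as |0> or |1> in C^2.  The node is then the 2 x 2**n matrix
# M = sum_a |C[a]><a|, i.e. a linear operator acting on the address space.
M = sum(basis(2, bit) @ basis(2 ** n, a).T for a, bit in contents.items())

# Looking up address 01 is now a matrix-vector product:
print((M @ basis(2 ** n, 0b01)).ravel())  # [0. 1.] = |1>, matching contents[0b01]
```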
Outlier Detection Using Extreme Learning Machines Based on Quantum Fuzzy C-Means
One of the most important concerns of a data miner is always to have accurate and error-free data: data that does not contain human errors and whose records are complete and correct. In this paper, a new learning model based on an extreme learning machine neural network is proposed for outlier detection. The function of neural networks depends on various parameters such as the structur...
Advances on Weightless Neural Systems
Random Access Memory (RAM) nodes can play the role of artificial neurons that are addressed by Boolean inputs and produce Boolean outputs. The weightless neural network (WNN) approach has an implicit inspiration in the decoding process observed in the dendritic trees of biological neurons. An overview of recent advances in weightless neural systems is presented here. Theoretical aspects, such a...
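A minimal sketch of such a node (illustrative only, not taken from the paper): the Boolean inputs are packed into a memory address, training writes the desired Boolean output at that location, and recall simply reads it back.

```python
class RAMNode:
    """A single weightless (RAM-based) neuron: Boolean in, Boolean out."""

    def __init__(self, n_inputs):
        self.memory = [0] * (2 ** n_inputs)  # one stored bit per address

    @staticmethod
    def _address(bits):
        # Pack the Boolean input tuple into an integer address.
        addr = 0
        for b in bits:
            addr = (addr << 1) | int(b)
        return addr

    def train(self, bits, target):
        self.memory[self._address(bits)] = int(target)

    def recall(self, bits):
        return self.memory[self._address(bits)]

node = RAMNode(n_inputs=3)
node.train((1, 0, 1), 1)       # store output 1 for input pattern 101
print(node.recall((1, 0, 1)))  # -> 1
print(node.recall((0, 0, 0)))  # -> 0 (untrained locations default to 0)
```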
Storage Capacity of RAM-based Neural Networks: Pyramids
Recently the authors developed a modular approach to assess the storage capacity of RAM-based neural networks [1] that can be applied to any architecture. It is based on collisions of information during the learning process. It has already been applied to the GNU architecture. In this paper, the technique is applied to the pyramid. The results explain practical problems reported in the literatur...
Convenient and Efficient Elimination of Heavy Metals from Wastewater Using Smart Pouch with Biomaterial
A newly developed Smart Pouch with enclosed biomaterial (Aloe vera and coconut husk powder) has been tested for the elimination of heavy metals, i.e. Pb²⁺, Cu²⁺, Ni²⁺ and Zn²⁺, from wastewater. The effect of concentration, pH, temperature, contact duration, etc. was investigated using batch experiments, which showed that the Pouch may be accepted for convenient, efficient and low-cost accumul...
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Published in: ESANN'2009 proceedings, European Symposium on Artificial Neural Networks - Advances in Computational Intelligence and Learning, Bruges (Belgium), 22-24 April 2009, d-side publi., ISBN 2-930307-09-9
Publication date: 2009